Thoughts on Siponen and Klaarvuniemi's "Demystifying Beliefs about the Natural Sciences in IS": The way forward
This is a comment on the paper by Siponen and Klaarvuniemi concerning the natural sciences. It argues that many of their points are correct but have been made before, particularly within critical realism. It suggests that the way forward is via a "mechanisms" view of natural (and social) science.
Information and Computation
In this chapter, concepts related to information and computation are reviewed
in the context of human computation. A brief introduction to information theory
and different types of computation is given. Two examples of human computation
systems, online social networks and Wikipedia, are used to illustrate how these
can be described and compared in terms of information and computation.
Comment: 9 pages, 3 figures. Draft of a chapter to be published in Michelucci, P. (Ed.) Handbook of Human Computation, Springer.
Epilogue: systems approaches and systems practice
Each of the five systems approaches discussed in this volume (system dynamics (SD), the viable systems model (VSM), strategic options development and analysis (SODA), soft systems methodology (SSM) and critical systems heuristics (CSH)) has a pedigree. Not in the sense of the sometimes absurd spectacle of animals paraded at dog shows. Rather, their pedigree derives from their systems foundations, their capacity to evolve and their flexibility in use. None of the five approaches has developed out of use in restricted and controlled contexts of either low or high levels of complicatedness. Neither has any one of them evolved as a consequence of being applied only to situations with either presumed stakeholder agreement on purpose, or courteous disagreement amongst stakeholders, or stakeholder coercion. The compilation is not a celebration of abstract "methodologies", but of theoretically robust approaches that have a genuine pedigree in practice.
Uncovering regulatory pathways that affect hematopoietic stem cell function using 'genetical genomics'
We combined large-scale mRNA expression analysis and gene mapping to identify genes and loci that control hematopoietic stem cell (HSC) function. We measured mRNA expression levels in purified HSCs isolated from a panel of densely genotyped recombinant inbred mouse strains. We mapped quantitative trait loci (QTLs) associated with variation in expression of thousands of transcripts. By comparing the physical transcript position with the location of the controlling QTL, we identified polymorphic cis-acting stem cell genes. We also identified multiple trans-acting control loci that modify expression of large numbers of genes. These groups of coregulated transcripts identify pathways that specify variation in stem cells. We illustrate this concept with the identification of candidate genes involved with HSC turnover. We compared expression QTLs in HSCs and brain from the same mice and identified both shared and tissue-specific QTLs. Our data are accessible through WebQTL, a web-based interface that allows custom genetic linkage analysis and identification of coregulated transcripts.
Defining a General Structure of Four Inferential Processes by Means of Four Pairs of Choices Concerning Two Basic Dichotomies
In previous papers I have characterized four ways of reasoning in Peirce's philosophy, and four ways of
reasoning in Computability Theory. I have established their correspondence on the basis of the four pairs
of choices regarding two dichotomies: the dichotomy between two kinds of Mathematics and
the dichotomy between two kinds of Logic. In the present paper I introduce four principles of reasoning in
theoretical Physics and interpret them, too, by means of the four pairs of choices regarding the above two
dichotomies. I show that there exists a meaningful correspondence among these three fourfold sets
of elements. This convergence of the characteristic ways of reasoning within three very different fields of
research (Peirce's philosophy, Computability Theory and physical theories) suggests that there exists a
general-purpose structure of four ways of reasoning. This structure is recognized as the one applied by Mendeleev
when he built his periodic table. Moreover, it is shown that a chemist applies all the above ways of
reasoning at the same time. Peirce's professional practice as a chemist, applying this
variety of reasoning simultaneously, explains his stubborn research into the variety of possible inferences.
Decision curve analysis revisited: overall net benefit, relationships to ROC curve analysis, and application to case-control studies
ABSTRACT: BACKGROUND: Decision curve analysis has been introduced as a method to evaluate prediction models in terms of their clinical consequences if used for a binary classification of subjects into a group who should and a group who should not be treated. The key concept for this type of evaluation is the "net benefit", a concept borrowed from utility theory. METHODS: We recall the foundations of decision curve analysis and discuss some new aspects. First, we stress the formal distinction between the net benefit for the treated and for the untreated and define the concept of the "overall net benefit". Next, we revisit the important distinction between the concept of accuracy, as typically assessed using the Youden index and a receiver operating characteristic (ROC) analysis, and the concept of utility of a prediction model, as assessed using decision curve analysis. Finally, we provide an explicit implementation of decision curve analysis to be applied in the context of case-control studies. RESULTS: We show that the overall net benefit, which combines the net benefit for the treated and the untreated, is a natural alternative to the benefit achieved by a model, being invariant with respect to the coding of the outcome and conveying a more comprehensive picture of the situation. Further, within the framework of decision curve analysis, we illustrate the important difference between the accuracy and the utility of a model, demonstrating how poor an accurate model may be in terms of its net benefit. Finally, we show that the application of decision curve analysis to case-control studies, where an accurate estimate of the true prevalence of a disease cannot be obtained from the data, is achieved with a few modifications to the original calculation procedure. CONCLUSIONS: We present several interrelated extensions to decision curve analysis that will both facilitate its interpretation and broaden its potential area of application.
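The net-benefit quantities the abstract refers to can be sketched in a few lines. This is an illustrative sketch only, not the authors' implementation: the function names are invented here, and taking the "overall net benefit" as the simple sum of the treated and untreated components is an assumption for illustration.

```python
def net_benefit_treated(tp, fp, n, pt):
    """Net benefit for the treated at risk threshold pt.

    Each false positive is weighted by the harm-to-benefit
    ratio pt / (1 - pt) implied by the chosen threshold.
    """
    w = pt / (1 - pt)
    return tp / n - w * fp / n

def net_benefit_untreated(tn, fn, n, pt):
    """Net benefit for the untreated: the mirror image, with the
    classes swapped and the weight inverted to (1 - pt) / pt."""
    w = (1 - pt) / pt
    return tn / n - w * fn / n

def overall_net_benefit(tp, fp, tn, fn, pt):
    """Combine the two components (assumed here to be a simple sum);
    the result is invariant to swapping the coding of the outcome."""
    n = tp + fp + tn + fn
    return (net_benefit_treated(tp, fp, n, pt)
            + net_benefit_untreated(tn, fn, n, pt))
```

A decision curve is then traced by evaluating these quantities over a range of thresholds pt rather than at a single operating point.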
Universal Probability-Free Conformal Prediction
We construct universal prediction systems in the spirit of Popper's
falsifiability and Kolmogorov complexity and randomness. These prediction
systems do not depend on any statistical assumptions (but under the IID
assumption they dominate, to within the usual accuracy, conformal prediction).
Our constructions give rise to a theory of algorithmic complexity and
randomness of time containing analogues of several notions and results of the
classical theory of Kolmogorov complexity and randomness.
Comment: 27 pages.
The discourse of Olympic security 2012: London 2012
This paper uses a combination of critical discourse analysis (CDA) and corpus linguistics (CL) to investigate the discursive realization of the security operation for the 2012 London Olympic Games. Drawing on Didier Bigo's (2008) conceptualisation of the "banopticon", it addresses two questions: what distinctive
linguistic features are used in documents relating to security for London 2012; and how is Olympic security realized as a discursive practice in these documents? Findings suggest that the documents indeed realized key features of the banopticon: exceptionalism, exclusion and prediction, as well as what we call "pedagogisation". Claims were made for the
exceptional scale of the Olympic events; predictive technologies were proposed to assess the
threat from terrorism; and documentary evidence suggests that access to Olympic venues
was being constituted to resemble transit through national boundaries.
- …